# Korean-English bilingual generation
All three entries are Transformers-compatible, multilingual large language models, so they share the same loading path in Hugging Face `transformers` (see the sketch after the table).

| Model | Developer | Description |
| --- | --- | --- |
| Gemma-Ko 7B | beomi | A Korean large language model built on Google's Gemma, offered here as a 7B-parameter version suited to Korean and English text generation. |
| PiVoT 0.1 Early | maywell | An early PiVoT checkpoint fine-tuned from Mistral 7B, derived from the Synatra v0.3 RP variant, with solid reported performance. |
| 42dot LLM-SFT 1.3B | 42dot | A 1.3B-parameter instruction-following model developed by 42dot, produced by supervised fine-tuning of a base model that follows the LLaMA 2 architecture. |
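As a minimal usage sketch, the snippet below loads the first entry with the `transformers` library. The Hub repository id `beomi/gemma-ko-7b` is an assumption inferred from the model and developer names above (verify it on the Hub); the other two models can be swapped in by changing `model_id`. `device_map="auto"` additionally requires the `accelerate` package.

```python
# Sketch: generating Korean text with Gemma-Ko 7B via Hugging Face transformers.
# The repo id below is assumed from the listing ("Gemma Ko 7b" by beomi).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "beomi/gemma-ko-7b"  # assumed Hub id; replace with the verified one

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so the 7B model fits on one GPU
    device_map="auto",          # requires `accelerate`
)

# Korean prompt: "Briefly explain the charm of autumn in Seoul."
prompt = "서울의 가을이 가진 매력을 간단히 설명해줘."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```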